Stein Shrinkage with Penalization and Second Order Efficiency in Semiparametrics
Authors
Abstract
The problem of estimating the centre of symmetry of an unknown periodic function observed in Gaussian white noise is considered. Using the penalized blockwise Stein method, a smoothing filter that allows one to define a penalized profile likelihood is proposed. The estimator of the centre of symmetry is then the maximizer of this penalized profile likelihood. This estimator is shown to be semiparametrically adaptive and efficient. Moreover, the second-order term of its risk expansion is proved to behave at least as well as the second-order term for the best possible estimator based on a monotone smoothing filter. Under mild assumptions, the estimator is shown to be second-order sharp minimax adaptive over the whole scale of Sobolev balls with smoothness β > 1. These results thus improve on Dalalyan, Golubev and Tsybakov (2006), where β ≥ 2 is required.
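As a rough illustration of the recipe described in this abstract, the sketch below simulates noisy Fourier coefficients of a shifted even periodic signal, forms penalized blockwise Stein-type shrinkage weights, and maximizes a simple profile criterion over a grid of candidate shifts. The block size, the penalty, and the exact form of the criterion are illustrative assumptions, not the paper's definitions.

```python
# A minimal numerical sketch (not the paper's exact construction): noisy Fourier
# coefficients of a shifted even periodic signal, penalized blockwise Stein-type
# shrinkage weights, and a profile criterion maximized over candidate shifts.
import numpy as np

rng = np.random.default_rng(0)

# --- simulate: X_k = f_k * exp(-2*pi*i*k*theta) + eps * xi_k, k = 1..K ---
K, eps, theta_true = 100, 0.05, 0.23
k = np.arange(1, K + 1)
f_k = 1.0 / (1.0 + k) ** 2                       # real coefficients of a smooth even f
xi = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)
X = f_k * np.exp(-2j * np.pi * k * theta_true) + eps * xi

# --- penalized blockwise Stein-type weights (block size and penalty are illustrative) ---
def stein_weights(coefs, eps, block_size=8, penalty=0.5):
    w = np.zeros_like(coefs, dtype=float)
    for start in range(0, len(coefs), block_size):
        block = coefs[start:start + block_size]
        T = len(block)
        energy = np.sum(np.abs(block) ** 2)
        # shrink each block towards zero; the (1 + penalty) factor mimics the
        # penalization added to the blockwise Stein filter
        w[start:start + T] = max(0.0, 1.0 - eps ** 2 * T * (1.0 + penalty) / energy)
    return w

# --- profile criterion over a grid of candidate shifts ---
def profile_criterion(theta, X, w, k):
    rotated = np.real(X * np.exp(2j * np.pi * k * theta))   # de-shifted real parts
    return np.sum(w * rotated ** 2)

w = stein_weights(X, eps)
# an even 1-periodic function is also symmetric about 1/2, so the shift is only
# identifiable modulo 1/2; search over [0, 0.5)
grid = np.linspace(0.0, 0.5, 1001, endpoint=False)
scores = np.array([profile_criterion(t, X, w, k) for t in grid])
theta_hat = grid[np.argmax(scores)]
print(f"true shift = {theta_true:.3f}, estimated shift = {theta_hat:.3f}")
```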
Similar works
James-Stein Shrinkage and Second Order Efficiency in Semiparametrics
The problem of estimating the centre of symmetry of an unknown periodic function observed in Gaussian white noise is considered. Using the penalized blockwise James-Stein method, a smoothing filter that allows one to define a penalized profile likelihood is proposed. The estimator of the centre of symmetry is then the maximizer of this penalized profile likelihood. This estimator is shown ...
Stein Shrinkage and Second-Order Efficiency for semiparametric estimation of the shift
The problem of estimating the shift (or, equivalently, the center of symmetry) of an unknown symmetric and periodic function f observed in Gaussian white noise is considered. Using the blockwise Stein method, a penalized profile likelihood with a data-driven penalization is introduced so that the estimator of the center of symmetry is defined as the maximizer of the penalized profile likelihood...
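For orientation, here is a brief LaTeX sketch of the shift-in-white-noise model and of a blockwise Stein-type filter with penalization; the block partition, the penalty p_j, and the exact profile criterion are written generically and are not claimed to match the paper's definitions.

```latex
% Hedged sketch: penalty p_j and the form of the criterion are illustrative.
\[
  dX(t) = f(t-\theta)\,dt + \varepsilon\,dW(t), \qquad t\in[0,1], \qquad f \text{ 1-periodic and even,}
\]
\[
  X_k = f_k\,e^{-2\pi i k\theta} + \varepsilon\,\xi_k \qquad \text{(Fourier coefficients, with } f_k\in\mathbb{R}\text{)},
\]
\[
  \hat h_k = \Bigl(1 - \frac{\varepsilon^{2}\,T_j\,(1+p_j)}{\sum_{l\in B_j}|X_l|^{2}}\Bigr)_{+}
  \qquad \text{for } k\in B_j \text{ (a block of } T_j \text{ coefficients)},
\]
\[
  \hat\theta = \arg\max_{\theta}\;\sum_{k}\hat h_k\,\bigl(\operatorname{Re}\,e^{2\pi i k\theta}X_k\bigr)^{2}.
\]
```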
Shrinkage and Penalized Likelihood as Methods to Improve Predictive Accuracy
Hans C. van Houwelingen and Saskia le Cessie, Department of Medical Statistics, P.O. Box 9604, 2300 RC Leiden, The Netherlands; email: [email protected]. Abstract: A review is given of shrinkage and penalization as tools to improve the predictive accuracy of regression models. The James-Stein estimator is taken as the starting point. Procedures covered are the Pre-test Estimation, ...
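As a small worked illustration of quadratic penalization improving prediction (hypothetical simulated data, not taken from the review), the sketch below compares ordinary least squares with a ridge fit.

```python
# Illustrative sketch: quadratic penalization (ridge) shrinks regression
# coefficients towards zero and can reduce out-of-sample prediction error
# relative to ordinary least squares when many effects are weak.
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 20
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]                              # mostly weak/zero effects
X_tr, X_te = rng.standard_normal((n, p)), rng.standard_normal((n, p))
y_tr = X_tr @ beta + rng.standard_normal(n)
y_te = X_te @ beta + rng.standard_normal(n)

def ridge(X, y, lam):
    # penalized least squares: argmin ||y - Xb||^2 + lam * ||b||^2
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

for lam in [0.0, 10.0]:                                  # lam = 0 is plain OLS
    b = ridge(X_tr, y_tr, lam)
    mse = np.mean((y_te - X_te @ b) ** 2)
    print(f"lambda = {lam:5.1f}  ||b|| = {np.linalg.norm(b):.2f}  test MSE = {mse:.2f}")
```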
Semiparametrics, Nonparametrics and Empirical Bayes Procedures in Linear Models
In a classical parametric setup, a key factor in the implementation of the Empirical Bayes methodology is the incorporation of a suitable prior that is compatible with the parametric setup and yet lends itself to the estimation of the Bayes (shrinkage) factor in an empirical manner. The situation is more complex in semi-parametric and (even more in) nonparametric models. Although the Dirichlet prior...
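For the classical parametric setup mentioned above, the normal-means model gives the standard worked example of estimating the Bayes shrinkage factor empirically; it is stated here only for orientation.

```latex
% Normal-means illustration: the shrinkage factor B is estimated from the data.
\[
  X_i \mid \theta_i \sim \mathcal{N}(\theta_i,\sigma^{2}), \qquad
  \theta_i \sim \mathcal{N}(0,\tau^{2}), \qquad i=1,\dots,p,
\]
\[
  \mathbb{E}[\theta_i \mid X_i] = (1-B)\,X_i, \qquad B=\frac{\sigma^{2}}{\sigma^{2}+\tau^{2}},
\]
\[
  \hat B = \frac{(p-2)\,\sigma^{2}}{\sum_{j=1}^{p}X_j^{2}}
  \quad\Longrightarrow\quad
  \hat\theta_i = \bigl(1-\hat B\bigr)X_i \quad \text{(the James--Stein estimator).}
\]
```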
Least Absolute Shrinkage is Equivalent to Quadratic Penalization
Adaptive ridge is a special form of ridge regression, balancing the quadratic penalization on each parameter of the model. This paper shows the equivalence between adaptive ridge and lasso (least absolute shrinkage and selection operator). This equivalence states that both procedures produce the same estimate. Least absolute shrinkage can thus be viewed as a particular quadratic penalization. F...
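As a hedged illustration of this equivalence (a sketch, not the paper's derivation), the lasso solution can be approximated by iteratively reweighted quadratic penalization, rebalancing each coefficient's ridge penalty according to its current magnitude; the function name and the penalty level below are illustrative.

```python
# Iteratively reweighted quadratic penalization (adaptive-ridge-style scheme):
# each step solves a ridge problem whose per-coefficient penalty is lam / |b_j|;
# its fixed point approximates the lasso solution of
#   (1/2) * ||y - Xb||^2 + lam * sum_j |b_j|.
import numpy as np

def adaptive_ridge_lasso(X, y, lam, n_iter=200, eps=1e-8):
    p = X.shape[1]
    b = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)   # plain ridge start
    for _ in range(n_iter):
        weights = lam / (np.abs(b) + eps)                     # per-parameter quadratic penalty
        b = np.linalg.solve(X.T @ X + np.diag(weights), X.T @ y)
    return b

# small demonstration on simulated data (hypothetical, for illustration only)
rng = np.random.default_rng(2)
n, p = 100, 10
beta = np.array([3.0, -2.0, 1.5] + [0.0] * 7)
X = rng.standard_normal((n, p))
y = X @ beta + rng.standard_normal(n)

b_hat = adaptive_ridge_lasso(X, y, lam=20.0)
print(np.round(beta, 3))    # true coefficients
print(np.round(b_hat, 3))   # weak coefficients are shrunk to (nearly) zero
```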